the part of complement or substitute? how ai increases the demand for human skills [pdf] that changes behavior

ref arxiv.org Complement or substitute? How AI increases the demand for human skills [pdf] 2024-12-30

I read complement or substitute? how ai increases the demand for human skills [pdf] as a constraint signal more than novelty. The link is just the anchor; the mechanics are where the leverage is.

see also: Compute Bottlenecks · Model Behavior

ground truth

The visible change is obvious; the deeper change is the permission it creates. I read this as a reset in expectations for teams like Compute Bottlenecks and Model Behavior. Once expectations shift, whatever fallback path exists in practice becomes the de facto policy.

observables

  • The framing of the paper compresses complexity into a single promise.
  • What looks like a surface change is actually a control move.
  • The dependency chain around the paper's claims is where risk accumulates, not at the surface.

keep / ignore

  • Signal: incentives now favor stability over novelty.
  • Noise: demos and commentary overstate production readiness.
  • Noise: early excitement won’t survive the next budget cycle.
  • Signal: the rollout path is designed for institutional buyers.

exposure map

  • Adopting the paper's framing amplifies model brittleness faster than the value it returns.
  • Governance drift turns tactical choices around it into strategic liabilities.
  • The smallest edge case becomes the largest reputational risk.

my take

My stance is pragmatic: assume the shift is real, yet delay lock-in until the operational story settles.

keywords: default drift · constraint signal

linkage

linkage tree
  • tags
    • #thoughtpiece
    • #ai
    • #2024
  • related
    • [[LLMs]]
    • [[Model Behavior]]

ending question

If the incentives flipped, what would stay sticky?